Neural Integration of Multimodal Events
Authors
Abstract
Our actions produce physical events that give rise to perceptual information across multiple modalities. A trip to the beach can be identified through all five of our senses: seeing the waves roll in from the horizon, hearing the waves break on the sand, smelling the ocean life, feeling the cool sensation of the water on our skin, and tasting the salty water. Yet information from one perceptual modality is not experienced separately from information in another modality; instead, our perception of the ocean blends the modality-specific information from our senses into an integrated, coherent percept. In a series of three experiments, we address how such percepts come to be; that is, what cues in the signal are used to combine different sensory information and, concurrently, what neural processing mechanisms are sensitive to these integration cues. Multimodal interactions have been found in both behavioral and neural research. In both cases, multimodal stimuli generate responses that are not the simple summation of their unimodal counterparts. This finding led researchers to examine what stimulus properties enable the efficient integration of information across modalities, and this research has identified three integration cues (time, space, and semantics) that influence the process of integration. In addition, research with both animals and humans reveals a consistent network of brain regions in posterior, temporal, and frontal cortices recruited during multimodal processing. However, the functional roles of the regions within this extended multimodal network remain relatively unknown. Consequently, our studies independently manipulate temporal and semantic congruency using a novel stimulus set of environmental events. The studies ask whether the two integration cues share the same neural resources or recruit separable neural substrates.
We also explore how these two integration cues interact with one another and, in particular, whether the timecourse of one cue differs from the other. That is, does a low-level cue like temporal synchrony, which is carried in the perceptual signals themselves, influence multimodal processing earlier than a higher-order cue like semantic congruency, which relies on stored, experiential knowledge? In short, the series of three studies aims to unravel the neural basis of multimodal integration. Behavioral research has highlighted three major characteristics of multimodal interactions. First, response times for detection or identification of a stimulus are faster for multimodal stimuli than for unimodal stimuli. Second, information from one modality can bias another modality, resulting in perceptual illusions that highlight the role of multimodal integration …
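The observation above, that multimodal responses exceed the simple sum of their unimodal counterparts, is commonly quantified in the neurophysiology literature with a multisensory enhancement index comparing the combined-modality response to the strongest unimodal response. The sketch below is illustrative only and not from this dissertation; the function name and example firing rates are hypothetical, while the formula follows the standard form used in superior-colliculus studies.

```python
def enhancement_index(audio_resp: float, visual_resp: float, multi_resp: float) -> float:
    """Percent multisensory enhancement: (CM - max(A, V)) / max(A, V) * 100,
    where CM is the combined-modality response and A, V are unimodal responses."""
    best_unimodal = max(audio_resp, visual_resp)
    return (multi_resp - best_unimodal) / best_unimodal * 100.0

# A superadditive case: the multimodal response (18 spikes) exceeds even the
# sum of the unimodal responses (6 + 8 = 14), yielding enhancement above 100%.
print(enhancement_index(6.0, 8.0, 18.0))  # → 125.0
```

Values above 100% indicate superadditivity; a value of 0% means the multimodal response merely matches the best unimodal one.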
Similar resources
The integration of audio-tactile information is modulated by multimodal social interaction with physical contact in infancy
Interaction between caregivers and infants is multimodal in nature. To react interactively and smoothly to such multimodal signals, infants must integrate all of these signals. However, few empirical infant studies have investigated how multimodal social interaction with physical contact facilitates multimodal integration, especially regarding audio-tactile (A-T) information. By using electroenc...
Multimodal Integration of Micro-Doppler Sonar and Auditory Signals for Behavior Classification with Convolutional Networks
The ability to recognize the behavior of individuals is of great interest in the general field of safety (e.g. building security, crowd control, transport analysis, independent living for the elderly). Here we report a new real-time acoustic system for human action and behavior recognition that integrates passive audio and active micro-Doppler sonar signatures over multiple time scales. The sys...
Towards Computational Modelling of Neural Multimodal Integration Based on the Superior Colliculus Concept
Information processing and responding to sensory input with appropriate actions are among the most important capabilities of the brain, and the brain has specific areas that deal with auditory or visual processing. Auditory information is sent first to the cochlea, then to the inferior colliculus, and then to the auditory cortex, where it is further processed so that then eyes, head...
Framework for Semantic Integration and Scalable Processing of City Traffic Events
Marupudi, Surendra Brahma. M.S., Department of Computer Science and Engineering, Wright State University, 2016. Framework for Semantic Integration and Scalable Processing of City Traffic Events. Intelligent traffic management requires analysis of a large volume of multimodal data from diverse domains. For the development of intelligent traffic applications, we need to address diversity in obser...
An Integration Principle for Multimodal Sensor Data Based on Temporal Coherence of Self-Organized Patterns
The world around us continuously offers huge amounts of information, from which living organisms can elicit the knowledge and understanding they need for survival or well-being. A fundamental cognitive feature that makes this possible is the ability of a brain to integrate the inputs it receives from different sensory modalities into a coherent description of its surrounding environment. By an...